Web Survey Bibliography
This paper reports two studies investigating the impact of online survey sponsorship on dropout rates. In Study 1, 498 participants were randomised to one of four 78-item online surveys. All four surveys were identical in content but differed in presentation format. The first pair was hosted on our faculty web server using LimeSurvey and preceded by an information page on our school website; our university logo featured prominently on every page of these surveys, representing a "high" level of university sponsorship. The remaining pair was hosted entirely on SurveyMonkey.com and made minimal reference to our university (i.e., a "low" level of university sponsorship). One version of each pair "forced" participants to answer every question on a page before continuing, whereas the other did not (i.e., all questions were "optional"). Overall, 13.9% of participants commenced but did not complete the surveys. The proportion of participants completing the high sponsorship surveys did not differ from the proportion completing the low sponsorship surveys. Among participants who completed the optional format surveys, those in the low sponsorship condition answered significantly more items than those in the high sponsorship condition; there was no such difference between high and low condition members who did not complete the optional format surveys. However, LimeSurvey and SurveyMonkey differ in basic page formatting, load speeds and several other respects, any of which could be responsible for these findings. These confounds were addressed in Study 2, in which 159 participants were randomised to one of two 65-item online surveys. Both were identical in content, used an optional response format, and were hosted on Qualtrics.com.
The first survey represented a high level of university sponsorship: it was preceded by an information page on our school website, and the university name and logo featured prominently on every page and in the survey URL. The second survey lacked these characteristics and represented a low level of university sponsorship. Overall, 23.9% of participants commenced but did not complete the surveys. The proportion of participants who completed the high sponsorship survey did not differ from the proportion who completed the low sponsorship survey. Furthermore, the numbers of items answered by participants who completed the two surveys were equivalent. Overall, although it is disappointing that dropout rates cannot be reduced simply by enhancing the visibility of an academic survey sponsor, researchers without ready access to university web servers or logos will appreciate these findings.
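The completion-rate comparisons reported above (high vs. low sponsorship) amount to a test of equality of two independent proportions. A minimal sketch in Python, using a pooled two-proportion z-test and hypothetical completion counts (the paper's actual cell counts are not given in this abstract):

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided z-test for equality of two independent proportions."""
    p_a = success_a / n_a
    p_b = success_b / n_b
    # Pooled proportion under the null hypothesis of equal completion rates
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF (via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: 210 of 249 completed the high-sponsorship surveys,
# 219 of 249 completed the low-sponsorship surveys (illustrative only).
z, p = two_proportion_z(210, 249, 219, 249)
```

A non-significant p-value here corresponds to the abstract's finding that completion proportions "did not differ" between conditions; an equivalent chi-square test on the 2x2 completion table would give the same conclusion.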
Web survey bibliography - 2012
- The Impact of Academic Sponsorship on Online Survey Dropout Rates; 2012; Allen, P. J., Roberts, L. D.
- Especially for You: Motivating Respondents in an Internet Panel by Offering Tailored Questions; 2012; Oudejans, M.
- Social media as a data collection tool: the impact of Facebook in behavioural research; 2012; Zoppos, E.
- Smartphone Apps and User Engagement: Collecting Data in the Digital Era; 2012; Link, M. W.
- Snowball Sampling in Online Social Networks; 2012; Raissi, M., Ackland, R.
- The Use of Facebook as a Locating and Contacting Tool; 2012; McCarthy, T.
- How Often Do You Use the App with a Bird on It? Exploring Differences in Survey Completion Times, Primacy...; 2012; Buskirk, T. D.
- Data quality of questions sensitive to social-desirability bias in web surveys; 2012; Lozar Manfreda, K., Zajc, N., Berzelak, N., Vehovar, V.
- Online Questionnaires: Development of ‘basic requirements’; 2012; Tries, S., Blanke, K.
- Social research in online context: methodological reflections on web surveys from a case study; 2012; Pandolfini, V.
- Efficacy of a health-related Facebook social network site on health-seeking behaviors; 2012; Woolley, P., Peterson, M.
- The war against unengaged online respondents; 2012; Gittelman, S. H., Trimarchi, E.
- Qualitatively Speaking: The five absolute, no-excuse must-dos for online qualitative researchers; 2012; Rossow, A.
- By the Numbers: Lessons for using online panels in B2B research; 2012; Elsner, N.
- Specialized Tools for Measuring Past Events; 2012; Belli, R. F.
- Transparency, Access and the Credibility of Survey Research; 2012; Lupia, A.
- Can Microtargeting Improve Survey Sampling? An Assessment of Accuracy and Bias in Consumer File Marketing...; 2012; Pasek, J.
- Anonymity and Confidentiality; 2012; Tourangeau, R.
- Cognitive Evaluation of Survey Instruments: State of the Science (Art?) and Future Directions; 2012; Willis, G. B.
- Oh, Just One More Thing … Leveraging “Leave-Behinds” in Data Collection; 2012; Link, M. W.
- Paradata; 2012; Kreuter, F.
- Computation of Survey Weights: Bridging Theory and Practice; 2012; DeBell, M.
- Optimizing Response Rates; 2012; Brick, J. M.
- Modes of Data Collection; 2012; Tourangeau, R.
- The Use and Effects of Incentives in Surveys; 2012; Singer, E.
- Improving Question Design to Maximize Reliability and Validity; 2012; Krosnick, J. A.
- Respondent Attrition vs Data Attrition and Their Reduction; 2012; Olsen, R. J.
- Survey Interviewing: Deviations from the Script; 2012; Schaeffer, N. C.
- How accurate are surveys of objective phenomena?; 2012; Chang, L. C., Krosnick, J. A.
- Measure the response burden in the Swedish Intrastat system; 2012; Weideskog, F.
- Mode and non-response effects and their treatment; 2012; Chrysanthopoulos, S., Georgostathi, A.
- What can be said about quality in the Central Population Register based on a self-completion survey...; 2012; Falnes-Dalheim, E., Pedersen, H. E.
- Improving the quality of complex surveys: The case of the EU Labour Force Survey; 2012; van der Valk, J.
- Pros and cons of Internet based User Satisfaction Surveys; 2012; Consoli, A., Matsulevits, L.
- Between demand and reality: Ensuring efficiency and quality in pretesting questionnaires; 2012; Sattelberger, S., Blanke, K.
- How to provide high data quality in online-questionnaires: Setting guidelines in design; 2012; Tries, S., Nebel, S., Blanke, K.
- Boosting Web pick-up Rates by referring to Compliance Principles; 2012; Falnes-Dalheim, E., Haraldsen, G., Sundvoll, A.
- Choosing a Data Collection Approach: Mixed Mode Design Experiences in Statistics Finland; 2012; Taskinen, P., Kiianmaa, N.
- E-book reading jumps; print book reading declines; 2012; Rainie, L., Duggan, M.
- Digital Divides: A connectivity continuum for the United States. Data from the 2011 Current Population...; 2012; File, T.
- How Should Debriefing Be Undertaken in Web-Based Studies? Findings From a Randomized Controlled Trial...; 2012; McCambridge, J., Kypri, K., Wilson, A.
- Better customer insight in real time; 2012; Macdonald, E., Wilson, H. N., Konus, H.
- Best practices in data cleaning: A complete guide to everything you need to do before and after collecting...; 2012; Osborne, J. W.
- Benchmarking for better surveys; 2012; Nallan, S.
- Adult gadget ownership over time (2006-2012); 2012
- Subjective Well-being Of Spanish Workers: Continuous Voluntary Web Survey Examination; 2012; de Pedraza, P., Guzi, M.
- Specific mixed-mode methodology to reach sensory disabled people in quantitative surveys; 2012; Fontaine, S.
- Response Mode Choice and the Hard-to-Interview in the American Community Survey; 2012; Nichols, E. M., Horwitz, R., Guarino Tancreto, J.
- Recruiting in an Internet panel using respondent driven sampling; 2012; Schonlau, M.
- A Choice in Mode: A Solution for Increasing Response Rates of Hard-to-Survey Populations?; 2012; Haan, M., Ongena, Y. P.